Asymptotic Level Density of the Elastic Net Self-Organizing Feature Map

Authors

  • Jens Christian Claussen
  • Heinz G. Schuster
Abstract

Whereas the Kohonen Self-Organizing Map shows an asymptotic level density following a power law with a magnification exponent 2/3, an exponent of 1 would be desirable in order to provide an optimal mapping in the sense of information theory. In this paper, we study analytically and numerically the magnification behaviour of the Elastic Net algorithm as a model for self-organizing feature maps. In contrast to the Kohonen map, the Elastic Net shows no power law, but for one-dimensional maps the density nevertheless follows a universal magnification law, i.e. it depends on the local stimulus density only, is independent of position, and decouples from the stimulus density at other positions.

Self-Organizing Feature Maps map an input space, such as the retina or skin receptor fields, into a neural layer by feedforward structures with lateral inhibition. Biological maps show as defining properties topology preservation, error tolerance, plasticity (the ability to adapt to changes in input space), and self-organized formation by a local process, since the global structure cannot be coded genetically. The self-organizing feature map algorithm proposed by Kohonen [1] has become a successful model for topology-preserving primary sensory processing in the cortex [2], and a useful tool in technical applications [3].

The Kohonen algorithm for Self-Organizing Feature Maps is defined as follows: every stimulus v of a Euclidean input space V is mapped to the neuron with the position s in the neural layer R with the highest neural activity, given by the condition

  |w_s − v| = min_{r ∈ R} |w_r − v|   (1)

where |·| denotes the Euclidean distance in input space. In the Kohonen model the learning rule for each synaptic weight vector w_r is given by

  w_r^new = w_r + η · g_rs · (v − w_r)   (2)

with g_rs a Gaussian function of the Euclidean distance |r − s| in the neural layer. The function g_rs describes the topology in the neural layer.
The parameter η determines the speed of learning and can be adjusted during the learning process. Topology preservation is enforced by the common update of all weight vectors whose neuron r is adjacent to the center of excitation s.
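The update defined by Eqs. (1) and (2) can be sketched as follows for a one-dimensional chain of neurons; this is an illustrative NumPy sketch, not the authors' code, and the parameter names (eta for η, sigma for the Gaussian width of g_rs) are assumptions.

```python
import numpy as np

def kohonen_step(w, v, eta=0.1, sigma=1.0):
    """One Kohonen learning step for a 1-D neural chain.

    w: (N, d) array of synaptic weight vectors w_r
    v: (d,) stimulus drawn from the input space V
    """
    # Eq. (1): winner s minimizes the Euclidean distance |w_r - v|
    s = np.argmin(np.linalg.norm(w - v, axis=1))
    # g_rs: Gaussian function of the lattice distance |r - s|
    r = np.arange(len(w))
    g = np.exp(-((r - s) ** 2) / (2.0 * sigma ** 2))
    # Eq. (2): common update of all weights toward the stimulus,
    # weighted by the neighborhood function g_rs
    return w + eta * g[:, None] * (v - w)

# Example: 10 neurons adapting to uniform stimuli on the unit interval
rng = np.random.default_rng(0)
w = rng.random((10, 1))
for _ in range(2000):
    w = kohonen_step(w, rng.random(1))
```

Because all neurons adjacent to the winner s share in the update, neighboring weight vectors are pulled toward similar stimuli, which is what enforces topology preservation.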


Similar articles

Development of oriented ocular dominance bands as a consequence of areal geometry

It has been hypothesized that the different appearance of ocular dominance bands in the cat and the monkey is a consequence of the different mapping geometries in these species (LeVay et al. 1985; Anderson et al. 1988). Here I investigate the impact of areal geometries on the preferred direction of ocular dominance bands in two adaptive map formation models, the self-organizing feature map and ...


EM Algorithms for Self-Organizing Maps

Self-organizing maps are popular algorithms for unsupervised learning and data visualization. Exploiting the link between vector quantization and mixture modeling, we derive EM algorithms for self-organizing maps with and without missing values. We compare self-organizing maps with the elastic-net approach and explain why the former is better suited for the visualization of high-dimensional dat...


A Fast Kohonen Net Implementation for Spert-II

We present an implementation of Kohonen Self-Organizing Feature Maps for the Spert-II vector microprocessor system. The implementation supports arbitrary neural map topologies and arbitrary neighborhood functions. For small networks, as used in real-world tasks, a single Spert-II board is measured to run Kohonen net classification at up to 208 million connections per second (MCPS). On a speech c...


Holistic Farsi handwritten word recognition using gradient features

In this paper we address the issue of recognizing Farsi handwritten words. Two types of gradient features are extracted from a sliding vertical stripe which sweeps across a word image. These are directional and intensity gradient features. The feature vector extracted from each stripe is then coded using the Self Organizing Map (SOM). In this method each word is modeled using the discrete Hidde...


Landforms identification using neural network-self organizing map and SRTM data

During an 11 days mission in February 2000 the Shuttle Radar Topography Mission (SRTM) collected data over 80% of the Earth's land surface, for all areas between 60 degrees N and 56 degrees S latitude. Since SRTM data became available, many studies utilized them for application in topography and morphometric landscape analysis. Exploiting SRTM data for recognition and extraction of topographic ...



Journal title:

Volume   Issue

Pages  -

Publication date: 2002